
    Designing Vibrotactile Widgets with Printed Actuators and Sensors

    Physical controls are fabricated through the complicated assembly of parts, require expensive machinery, and are prone to mechanical wear. One solution is to embed controls directly in interactive surfaces, but this loses the proprioceptive component of gestural interaction that makes physical controls discoverable and usable by hand gestures alone, so it must be compensated for, by vibrotactile feedback for instance. Vibrotactile actuators face the same issues as physical controls. We propose printed vibrotactile actuators and sensors. They are printed on plastic sheets, with piezoelectric ink for actuation and silver ink for conductive elements such as wires and capacitive sensors. These printed actuators and sensors make it possible to design vibrotactile widgets on curved surfaces, without complicated mechanical assembly.
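
    As a rough sketch of how such a printed widget might behave, the Python fragment below pairs a thresholded capacitive reading with a short decaying sine burst of the kind commonly driven through piezoelectric actuators. All names, thresholds, and frequencies here are illustrative assumptions, not values from the paper.

        import numpy as np

        SAMPLE_RATE = 48_000     # audio-rate output, typical for piezo drivers
        TOUCH_THRESHOLD = 0.6    # normalized capacitance; tune per printed sensor

        def tactile_click(freq_hz: float = 250.0, dur_s: float = 0.02) -> np.ndarray:
            """A short exponentially decaying sine burst, a common vibrotactile 'click'."""
            t = np.linspace(0.0, dur_s, int(SAMPLE_RATE * dur_s), endpoint=False)
            return np.sin(2 * np.pi * freq_hz * t) * np.exp(-t / (dur_s / 4))

        def on_sensor_sample(cap_value: float, was_touching: bool):
            """Emit a click waveform on the touch-down edge of a capacitive sensor."""
            touching = cap_value > TOUCH_THRESHOLD
            burst = tactile_click() if (touching and not was_touching) else None
            return burst, touching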

    Data Changes Everything: Challenges and Opportunities in Data Visualization Design Handoff

    Complex data visualization design projects often entail collaboration between people with different visualization-related skills. For example, many teams include both designers who create new visualization designs and developers who implement the resulting visualization software. We identify gaps between data characterization tools, visualization design tools, and development platforms that pose challenges for designer-developer teams working to create new data visualizations. While it is common for commercial interaction design tools to support collaboration between designers and developers, creating data visualizations poses several unique challenges that current tools do not support. In particular, visualization designers must characterize and build an understanding of the underlying data, then specify layouts, data encodings, and other data-driven parameters that will be robust across many different data values. In larger teams, designers must also clearly communicate these mappings and their dependencies to developers, clients, and other collaborators. We report observations and reflections from five large multidisciplinary visualization design projects and highlight six data-specific visualization challenges for design specification and handoff: adapting to changing data, anticipating edge cases in data, understanding technical challenges, articulating data-dependent interactions, communicating data mappings, and preserving the integrity of data mappings across iterations. Based on these observations, we identify opportunities for future tools for prototyping, testing, and communicating data-driven designs, which might contribute to more successful and collaborative data visualization design. Comment: 11 pages, 11 figures. To appear in IEEE Transactions on Visualization and Computer Graphics. To be presented at the IEEE VIS 2019 Conference.
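
    To make the handoff problem concrete, here is a minimal sketch (hypothetical names, not from the paper) of a data mapping handed off as a rule with its dependencies, so it stays valid under changing data and constant-value edge cases, rather than as pixel values measured from one sample dataset.

        from dataclasses import dataclass

        @dataclass
        class LinearEncoding:
            """A data-to-visual mapping specified by rule, not by example values,
            so it survives when the underlying data changes."""
            field: str
            out_min: float
            out_max: float

            def scale(self, values: list[float]) -> list[float]:
                lo, hi = min(values), max(values)
                span = (hi - lo) or 1.0   # guard the edge case of constant data
                return [self.out_min + (v - lo) / span * (self.out_max - self.out_min)
                        for v in values]

        # e.g. a designer hands off "radius encodes population, 2-40 px" rather
        # than pixel radii measured from one sample dataset:
        radius = LinearEncoding(field="population", out_min=2.0, out_max=40.0)
        print(radius.scale([1_000, 50_000, 1_000_000]))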

    Printgets: an Open-Source Toolbox for Designing Vibrotactile Widgets with Industrial-Grade Printed Actuators and Sensors

    New technologies for printing sensors and actuators combine the layout flexibility of touchscreen interfaces with localized vibrotactile feedback, but their fabrication still requires industrial-grade facilities. Until these technologies become easily replicable, interaction designers need material for ideation. We propose an open-source hardware and software toolbox providing maker-grade tools for the iterative design of vibrotactile widgets with industrial-grade printed sensors and actuators. Our hardware toolbox provides a mechanical structure to clamp and stretch printed sheets, and electronic boards to drive sensors and actuators. Our software toolbox expands the design space of haptic interaction techniques by reusing the wide palette of available audio processing algorithms to generate real-time vibrotactile signals. We validate our toolbox with the implementation of three exemplar interface elements with tactile feedback: buttons, sliders, and touchpads.
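
    As an illustration of the audio-processing reuse the toolbox proposes, the sketch below renders a slider "texture" with a standard audio technique, amplitude modulation; the carrier and modulation parameters are assumptions chosen for illustration, not the toolbox's actual defaults.

        import numpy as np

        SAMPLE_RATE = 48_000

        def slider_texture(position: float, dur_s: float = 0.05) -> np.ndarray:
            """Borrow an audio technique (amplitude-modulated sine) for haptics:
            carrier in the skin-sensitive 200-300 Hz band, modulation rate
            rising as the finger slides along the printed slider."""
            t = np.linspace(0.0, dur_s, int(SAMPLE_RATE * dur_s), endpoint=False)
            carrier = np.sin(2 * np.pi * 250.0 * t)
            mod_rate = 10.0 + 90.0 * position   # 10-100 Hz across the slider
            envelope = 0.5 * (1.0 + np.sin(2 * np.pi * mod_rate * t))
            return carrier * envelope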

    Investigating the Effect of Tactile Input and Output Locations for Drivers’ Hands on In-Car Tasks Performance

    This paper reports a study investigating the effects of tactile input and output from the steering wheel and the centre console on non-driving task performance. While driving, participants performed list selection tasks using tactile switches and experienced tactile feedback on either the non-dominant hand, the dominant hand, or both hands as they browsed the list. Our results show that the average duration for selecting an item is 30% shorter when interacting with the steering wheel, and that performance increases by 20% when tactile feedback is provided. Our findings reveal that input location prevails over output location when designing interaction for drivers. However, tactile feedback on the steering wheel is beneficial when provided at the same location as the input or to both hands. These results will help designers understand the trade-offs of using different interaction locations in the car.

    HapBead: on-skin microfluidic haptic interface using tunable bead

    On-skin haptic interfaces made of thin, flexible soft elastomers have improved significantly in recent years. Many focus on vibrotactile feedback, which requires complicated parameter tuning. Another approach uses mechanical forces, created via piezoelectric devices and other methods, for non-vibratory haptic sensations such as stretching and twisting; these devices are often bulky, and their electronic components and drivers are complicated, with limited control of timing and precision. This paper proposes HapBead, a new on-skin haptic interface capable of rendering vibration-like tactile feedback using microfluidics. HapBead leverages a microfluidic channel to precisely and agilely oscillate a small bead via liquid flow, generating various motion patterns in the channel that create highly tunable haptic sensations on the skin. We developed a proof-of-concept design to implement a thin, flexible, and affordable HapBead platform, and verified its haptic rendering capabilities by attaching it to users' fingertips. A study confirmed that participants could accurately distinguish six different haptic patterns rendered by HapBead. HapBead enables new wearable display applications with multiple integrated functionalities, such as on-skin haptic doodles, mixed-reality haptics, and visual-haptic displays.
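
    A minimal sketch of the control side of such an interface, assuming a pump driven at a fixed control rate: distinct haptic patterns reduce to distinct flow-rate waveforms that oscillate the bead. Pattern names and parameters are illustrative, not HapBead's actual set.

        import numpy as np

        CONTROL_RATE = 1_000   # Hz; assumed control-loop rate for the pump driver

        def bead_drive(pattern: str, dur_s: float = 1.0) -> np.ndarray:
            """Flow-rate commands oscillating a bead in a microfluidic channel;
            the sign of each command reverses the flow direction."""
            t = np.linspace(0.0, dur_s, int(CONTROL_RATE * dur_s), endpoint=False)
            if pattern == "buzz":     # fast symmetric back-and-forth motion
                return np.sign(np.sin(2 * np.pi * 40.0 * t))
            if pattern == "pulse":    # slow one-directional pushes, 2 per second
                return np.clip(np.sin(2 * np.pi * 2.0 * t), 0.0, None) ** 4
            return np.zeros_like(t)   # default: no flow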

    Designing Interaction for Browsing Media Content Collections (by Similarity)

    Sound designers source sounds in massive, heavily tagged collections. When searching for media content, once queries are filtered by keywords, hundreds of items are left to review. How can we present these results efficiently? This doctoral work aims at improving the usability of media collection browsers by blending techniques from multimedia information retrieval (MIR) and human-computer interaction (HCI). We produced an in-depth state-of-the-art review of media browsers and surveyed the HCI and MIR techniques that underpin our work: organization by content-based similarity (MIR), information visualization and gestural interaction (HCI). We developed the MediaCycle framework for organization by content-based similarity and the DeviceCycle toolbox for rapid prototyping of gestural interaction, both of which facilitated the design of several media browsers, some of which we evaluated for usability. Our main contribution is AudioMetro, an interactive visualization of sound collections. Sounds are represented by content-based glyphs that map perceptual sharpness (audio) to brightness and contour (visual). These glyphs are positioned in a starfield display using Student t-distributed Stochastic Neighbor Embedding (t-SNE) for dimensionality reduction, followed by a proximity grid optimized to preserve direct neighbors. A known-item search evaluation shows that our technique significantly outperforms a grid of sounds represented by dots and ordered by filename.
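
    The layout pipeline can be sketched in a few lines, assuming scikit-learn's t-SNE and a greedy nearest-free-cell snap as a stand-in for the thesis's optimized proximity grid (the sketch assumes grid_side * grid_side >= number of sounds).

        import numpy as np
        from sklearn.manifold import TSNE

        def layout_sounds(features: np.ndarray, grid_side: int) -> np.ndarray:
            """Project audio feature vectors to 2D with t-SNE, then snap each
            point to the nearest free cell of a grid_side x grid_side grid."""
            xy = TSNE(n_components=2, perplexity=15.0,
                      random_state=0).fit_transform(features)
            # Normalize the embedding into grid coordinates.
            xy = (xy - xy.min(axis=0)) / (np.ptp(xy, axis=0) + 1e-9) * (grid_side - 1)
            taken: set[tuple[int, int]] = set()
            cells = np.zeros_like(xy, dtype=int)
            for i, (x, y) in enumerate(xy):
                # Search outward from the ideal cell until a free one is found.
                for r in range(grid_side):
                    candidates = [(int(round(x)) + dx, int(round(y)) + dy)
                                  for dx in range(-r, r + 1)
                                  for dy in range(-r, r + 1)]
                    free = [c for c in candidates
                            if 0 <= c[0] < grid_side and 0 <= c[1] < grid_side
                            and c not in taken]
                    if free:
                        cells[i] = min(free,
                                       key=lambda c: (c[0] - x) ** 2 + (c[1] - y) ** 2)
                        taken.add(tuple(cells[i]))
                        break
            return cells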

    Force-feedback (rotary) audio browsing

    A subset of not-so-new interfaces for musical expression has traditionally been employed in an artistic and scientific field related to, and generative of, computer music: physical/tangible controls for media browsing. Cyclic representations of time may have motivated the use of rotary controls for temporal media (audio and video). Rotary controls were widely used by experts in audio editing and video montage even before their systems were computerized, with passive proprioceptive and kinesthetic feedback (on the hands) fixed by the physical controls at design and fabrication time. Why are there no cost-effective commercial devices for force-feedback rotary control widely available for digital systems, with user-definable mappings, besides the upcoming Microsoft Surface Dial? Can we build one from off-the-shelf and repurposed components? This talk starts with a short overview of past personal projects on tangible-to-force-feedback media browsing. Its core is a hands-on log of attempts to replicate interaction techniques for force-feedback audio browsing from seminal papers, working towards a "hello world" tutorial, using a recent low-cost open-source, open-hardware servo motor project (MechaDuino) and Purr Data, a fork of the Pure Data visual programming environment for audio/control dataflow that has already been used for prototyping force-feedback and music applications.
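
    The canonical technique behind such replications is a spring-and-damper pull toward the nearest detent, so that browsing a list of audio items feels like clicking through notches. A minimal sketch follows, with illustrative constants rather than MechaDuino's actual parameters.

        import math

        DETENTS = 24        # one detent per list item on the ring (illustrative)
        STIFFNESS = 0.8     # N*m/rad toward the nearest detent; tune to the motor
        DAMPING = 0.02      # N*m*s/rad, damps oscillation around each detent

        def detent_torque(angle_rad: float, velocity_rad_s: float) -> float:
            """Spring-and-damper toward the nearest detent: the classic
            haptic-knob rendering technique."""
            pitch = 2 * math.pi / DETENTS
            nearest = round(angle_rad / pitch) * pitch
            return -STIFFNESS * (angle_rad - nearest) - DAMPING * velocity_rad_s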

    • …